# African language model

## Llama 3.2 400M Amharic

A streamlined version of Meta's Llama-3.2-1B model, pretrained specifically for Amharic, with 400 million parameters and a context length of 1024 tokens.

Tags: Large Language Model · Transformers · Other

rasyosef · 310 · 3
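The card above describes a model with a 1024-token context window that is usable through the Transformers library. A minimal sketch of loading it with the `pipeline` API follows; the Hub repo id `rasyosef/Llama-3.2-400M-Amharic` is an assumption based on the author and model names in the listing, and the context-clamping helper is illustrative, not part of the model card:

```python
CONTEXT_LENGTH = 1024  # context length stated on the model card


def clamp_new_tokens(prompt_tokens: int, requested: int,
                     context: int = CONTEXT_LENGTH) -> int:
    """Cap generation so prompt tokens + new tokens fit the context window."""
    return max(0, min(requested, context - prompt_tokens))


def generate_amharic(prompt: str,
                     model_id: str = "rasyosef/Llama-3.2-400M-Amharic",
                     max_new_tokens: int = 64) -> str:
    """Generate Amharic text; downloads the model from the Hub on first call.

    The repo id above is assumed, not confirmed by this listing.
    """
    # Import deferred so the sketch can be read without transformers installed.
    from transformers import pipeline

    generator = pipeline("text-generation", model=model_id)
    out = generator(prompt, max_new_tokens=max_new_tokens)
    return out[0]["generated_text"]
```

For example, with a 1000-token prompt, `clamp_new_tokens(1000, 64)` caps generation at 24 new tokens so the total stays within the 1024-token window.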
## Gpt2 Swahili

A Swahili GPT-2 language model trained with the HuggingFace Flax framework, developed during the JAX/Flax Community Week project.

Tags: Large Language Model · Other
flax-community · 22 · 2